71 research outputs found

    Non-systemic transmission of tick-borne diseases: a network approach

    Tick-borne diseases can be transmitted via non-systemic (NS) transmission. This occurs when a tick acquires the infection by co-feeding with infected ticks on the same host, resulting in direct pathogen transmission between the vectors without infecting the host. This transmission is peculiar, as it does not require any systemic infection of the host. NS transmission is the main route sustaining the persistence of the Tick-Borne Encephalitis virus in nature. By describing the heterogeneous aggregation of ticks on hosts through a dynamical bipartite graph representation, we are able to mathematically define NS transmission and to characterize the epidemiological conditions for pathogen persistence. Despite the fact that the underlying network is largely fragmented, analytical and computational results show that the larger the variability of the aggregation, the easier it is for the pathogen to persist in the population. Comment: 15 pages, 4 figures, to be published in Communications in Nonlinear Science and Numerical Simulation
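
    As a rough illustration of the mechanism described above, the sketch below simulates co-feeding (non-systemic) transmission among ticks aggregated on hosts, with the tick-host bipartite graph redrawn at every step. It is not the paper's model: the population sizes, the negative-binomial aggregation, and all rates (P_COFEED, RECOVERY, AGG_K) are illustrative assumptions.

```python
# A minimal sketch (not the paper's model) of non-systemic, co-feeding
# transmission: ticks are aggregated on hosts and an infected tick can
# infect susceptible ticks sharing the same host, without any host infection.
# All parameter names and values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

N_TICKS = 2000          # tick population size
N_HOSTS = 200           # host population size
P_COFEED = 0.05         # per-contact tick-to-tick transmission probability
RECOVERY = 0.02         # per-step probability a tick clears the infection
AGG_K = 0.3             # negative-binomial dispersion: small k = strong aggregation
STEPS = 300

infected = np.zeros(N_TICKS, dtype=bool)
infected[rng.choice(N_TICKS, size=10, replace=False)] = True

for t in range(STEPS):
    # Rebuild the bipartite tick-host graph each step: aggregated attachment
    # is mimicked by giving each host a negative-binomial "attractiveness".
    weights = rng.negative_binomial(AGG_K, AGG_K / (AGG_K + 10.0), size=N_HOSTS) + 1e-9
    hosts_of_ticks = rng.choice(N_HOSTS, size=N_TICKS, p=weights / weights.sum())

    # Co-feeding transmission: on each host, every susceptible tick can be
    # infected by each infected tick feeding on the same host.
    for h in np.unique(hosts_of_ticks[infected]):
        on_host = hosts_of_ticks == h
        n_inf = np.count_nonzero(infected & on_host)
        susceptible = on_host & ~infected
        p_inf = 1.0 - (1.0 - P_COFEED) ** n_inf
        infected[susceptible] |= rng.random(np.count_nonzero(susceptible)) < p_inf

    # Recovery / tick turnover.
    infected &= rng.random(N_TICKS) > RECOVERY

print(f"prevalence after {STEPS} steps: {infected.mean():.3f}")
```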

    Firsthand Opiates Abuse on Social Media: Monitoring Geospatial Patterns of Interest Through a Digital Cohort

    In the last decade, drug overdose deaths have reached staggering proportions in the US. Beyond the raw yearly death count, which is worrisome per se, an alarming picture comes from the steep acceleration of this rate, which increased by 21% from 2015 to 2016. While traditional public health surveillance suffers from its own biases and limitations, digital epidemiology offers a new lens to extract signals from the Web and social media that can complement official statistics. In this paper we present a computational approach to identify a digital cohort that may provide an updated and complementary view on the opioid crisis. We introduce an information retrieval algorithm suited to identifying relevant subspaces of discussion on social media, and use it to mine data from users showing explicit interest in discussions about opioid consumption on Reddit. Moreover, despite the pseudonymous nature of the user base, almost 1.5 million users were geolocated at the US state level, in good agreement with the census population distribution. A measure of the prevalence of interest in opiate consumption was estimated at the state level, producing a novel indicator carrying information that is not entirely encoded in standard surveillance. Finally, we provide a domain-specific vocabulary containing informal lexicon and street nomenclature extracted from user-generated content, which researchers and practitioners can use to implement novel digital public health surveillance methodologies supporting policy makers in fighting the opioid epidemic. Comment: Proceedings of the 2019 World Wide Web Conference (WWW '19)
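
    The sketch below illustrates, on made-up numbers, the kind of state-level indicator mentioned above: counts of geolocated cohort users normalized by an external population figure. The states, counts, and populations are placeholders, not the study's data.

```python
# A minimal sketch of a state-level "prevalence of interest" indicator:
# the count of geolocated cohort users in each state is normalized by an
# external population figure. All numbers below are made-up placeholders.
from collections import Counter

# Hypothetical (state, n_cohort_users) counts from geolocated Reddit users.
cohort_users = Counter({"OH": 4200, "WV": 900, "CA": 11000, "NY": 7800})

# Hypothetical census populations (in thousands).
population_k = {"OH": 11780, "WV": 1793, "CA": 39538, "NY": 20201}

prevalence = {
    state: 1000.0 * n / population_k[state]   # cohort users per million residents
    for state, n in cohort_users.items()
}

for state, value in sorted(prevalence.items(), key=lambda kv: -kv[1]):
    print(f"{state}: {value:.1f} cohort users per million residents")
```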

    Optimizing surveillance for livestock disease spreading through animal movements

    The spatial propagation of many livestock infectious diseases critically depends on animal movements among premises, so knowledge of movement data may help us detect, manage, and control an outbreak. The identification of robust spreading features of the system is, however, hampered by the temporal dimension characterizing population interactions through movements. Traditional centrality measures do not provide relevant information, as results strongly fluctuate in time and outbreak properties heavily depend on geotemporal initial conditions. Focusing on the case study of cattle displacements in Italy, we aim to characterize livestock epidemics in terms of robust features useful for planning and control, in order to deal with temporal fluctuations, sensitivity to initial conditions, and missing information during an outbreak. Through spatial disease simulations, we detect spreading paths that are stable across different initial conditions, allowing the clustering of the seeds and reducing the epidemic variability. Paths also allow us to identify premises, called sentinels, that have a large probability of being infected and provide critical information on the outbreak origin, as encoded in the clusters. This novel procedure provides a general framework that can be applied to specific diseases, aiding risk assessment and informing the design of optimal surveillance systems. Comment: Supplementary Information at https://sites.google.com/site/paolobajardi/Home/archive/optimizing_surveillance_ESM_l.pdf?attredirects=
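
    One possible reading of the sentinel idea is sketched below: many SI simulations are run over a time-ordered list of animal movements from random seeds, and premises are ranked by how often they become infected. The toy movement list, the transmission probability, and the run count are assumptions for illustration only, not the paper's simulation setup.

```python
# A minimal sketch (not the paper's exact procedure) of ranking "sentinel"
# premises: run many SI spreading simulations over a time-ordered list of
# animal movements, starting from random seeds, and score each premise by
# how often it ends up infected. The toy movement list is an assumption.
import random
from collections import defaultdict

# Each movement is (day, origin_premise, destination_premise).
movements = [
    (1, "A", "B"), (1, "C", "D"), (2, "B", "E"),
    (2, "D", "E"), (3, "E", "F"), (3, "A", "D"),
]
premises = sorted({p for _, o, d in movements for p in (o, d)})

def run_si(seed_premise, transmission_prob=0.8, rng=random):
    """Discrete SI process: a movement from an infected origin may infect
    the destination with probability `transmission_prob`."""
    infected = {seed_premise}
    for _, origin, dest in sorted(movements):           # respect temporal order
        if origin in infected and rng.random() < transmission_prob:
            infected.add(dest)
    return infected

random.seed(0)
hits = defaultdict(int)
N_RUNS = 2000
for _ in range(N_RUNS):
    seed = random.choice(premises)
    for p in run_si(seed):
        hits[p] += 1

# Premises with the highest infection probability are sentinel candidates.
for p, h in sorted(hits.items(), key=lambda kv: -kv[1]):
    print(f"{p}: infection probability {h / N_RUNS:.2f}")
```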

    Interplay of network dynamics and ties heterogeneity on spreading dynamics

    The structure of a network dramatically affects the spreading phenomena unfolding upon it. The contact distribution of the nodes has long been recognized as the key ingredient influencing outbreak events. However, limited knowledge is currently available on the role of edge weights in the persistence of a pathogen. At the same time, recent works have shown a strong influence of temporal network dynamics on disease spreading. In this work we provide an analytical understanding, corroborated by numerical simulations, of the conditions for an infected stable state in weighted networks. In particular, we reveal the role of the heterogeneity of edge weights, and of the dynamic assignment of weights to the ties of the network, in driving the spread of the epidemic. In this context we show that when weights are dynamically assigned to ties in the network, a heterogeneous distribution is able to hamper the diffusion of the disease, contrary to what happens when weights are fixed in time. Comment: 10 pages, 10 figures
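
    The comparison discussed above can be prototyped as in the sketch below: an SIS process on a random network with heavy-tailed edge weights that are either kept fixed or reshuffled across edges at every step. The network model, the rates (beta, mu), and the weight distribution are illustrative assumptions, not the paper's analytical setting.

```python
# A minimal sketch, under illustrative assumptions, of the comparison made
# above: an SIS process on a weighted contact network where heavy-tailed
# edge weights are either fixed in time or reshuffled across edges at every
# step. Parameters and the network itself are placeholders, not the paper's.
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)

def sis_prevalence(reshuffle, beta=0.02, mu=0.1, steps=400):
    g = nx.erdos_renyi_graph(500, 0.02, seed=1)
    edges = list(g.edges())
    weights = rng.pareto(2.0, size=len(edges)) + 1.0     # heavy-tailed weights
    infected = set(rng.choice(g.number_of_nodes(), size=25, replace=False))

    for _ in range(steps):
        if reshuffle:                                    # dynamic weight assignment
            rng.shuffle(weights)
        new_infected = set(infected)
        for (u, v), w in zip(edges, weights):
            # Per-edge infection probability grows with the edge weight.
            p = 1.0 - (1.0 - beta) ** w
            if u in infected and v not in infected and rng.random() < p:
                new_infected.add(v)
            if v in infected and u not in infected and rng.random() < p:
                new_infected.add(u)
        # Recovery with probability mu per step.
        infected = {n for n in new_infected if rng.random() > mu}
    return len(infected) / g.number_of_nodes()

print("fixed weights:     ", sis_prevalence(reshuffle=False))
print("reshuffled weights:", sis_prevalence(reshuffle=True))
```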

    Epidemic spreading: the role of host mobility and transportation networks

    In recent years, the increasing availability of computing power has made it possible both to gather an unprecedented amount of data depicting the global interconnections of modern society and to envision computational tools able to tackle the analysis and modeling of dynamical processes unfolding on such a complex reality. In this perspective, the quantitative approach of Physics is catalyzing the growth of new interdisciplinary fields aimed at understanding complex techno-socio-ecological systems. By recognizing the crucial role of host mobility in the dissemination of infectious diseases, and by leveraging a network science approach to handle the large-scale datasets describing global interconnectivity, in this thesis we present a theoretical and computational framework to simulate epidemics of emerging infectious diseases in real settings. In particular, we tackle two different public health issues. First, we present the Global Epidemic and Mobility model (GLEaM), designed to simulate the spreading of an influenza-like illness at the global scale by integrating real worldwide mobility data. The 2009 H1N1 pandemic demonstrated the need for mathematical models to provide epidemic forecasts and to assess the effectiveness of different intervention policies. In this perspective, we present the results achieved in real time during the unfolding of the epidemic, together with a posteriori analyses of travel-related mitigation strategies and model validation. The second problem that we address is related to epidemic spreading on evolving networked systems. In particular, we analyze a detailed dataset of livestock movements in Italy in order to characterize the temporal correlations and the statistical properties governing the system. We then study the spreading of an infectious disease, in order to characterize the vulnerability of the system and to design novel control strategies. This work is an interdisciplinary approach that merges statistical physics techniques with complex and multiscale system analysis in the context of host mobility and computational epidemiology.
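
    The metapopulation idea behind models such as GLEaM can be conveyed by the minimal sketch below: each patch runs local SIR dynamics and a mobility matrix moves a fraction of each compartment between patches at every step. This is not the GLEaM implementation; the patch sizes, the rates, and the mobility matrix are placeholder values.

```python
# A minimal sketch of the metapopulation idea behind models such as GLEaM
# (not the actual GLEaM implementation): each subpopulation runs local SIR
# dynamics and travellers redistribute individuals according to a mobility
# matrix. Population sizes, rates, and the mobility matrix are illustrative.
import numpy as np

N_PATCHES = 4
pop = np.array([5e6, 2e6, 8e6, 1e6])
# mobility[i, j]: daily fraction of patch i residents travelling to patch j.
mobility = np.full((N_PATCHES, N_PATCHES), 0.001)
np.fill_diagonal(mobility, 0.0)

beta, gamma = 0.35, 0.2                      # SIR transmission / recovery rates
S, I, R = pop.copy(), np.zeros(N_PATCHES), np.zeros(N_PATCHES)
I[0], S[0] = 100.0, S[0] - 100.0             # seed the epidemic in patch 0

for day in range(200):
    # Local (deterministic) SIR step in each patch; patch sizes are treated
    # as constant in the force of infection for simplicity.
    new_inf = beta * S * I / pop
    new_rec = gamma * I
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

    # Travel step: move a fraction of each compartment between patches.
    for X in (S, I, R):
        outflow = mobility * X[:, None]      # travellers from i to j
        X += outflow.sum(axis=0) - outflow.sum(axis=1)

print("final attack rate per patch:", np.round(R / pop, 3))
```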

    To trust or not to trust an explanation: using LEAF to evaluate local linear XAI methods

    The main objective of eXplainable Artificial Intelligence (XAI) is to provide effective explanations for black-box classifiers. The existing literature lists many desirable properties for explanations to be useful, but there is no consensus on how to evaluate explanations quantitatively in practice. Moreover, explanations are typically used only to inspect black-box models, and the proactive use of explanations as decision support is generally overlooked. Among the many approaches to XAI, a widely adopted paradigm is that of Local Linear Explanations, with LIME and SHAP emerging as state-of-the-art methods. We show that these methods are plagued by many defects, including unstable explanations, divergence of actual implementations from the promised theoretical properties, and explanations for the wrong label. This highlights the need for standard and unbiased evaluation procedures for Local Linear Explanations in the XAI field. In this paper we address the problem of identifying a clear and unambiguous set of metrics for the evaluation of Local Linear Explanations. This set includes both existing metrics and novel ones defined specifically for this class of explanations. All metrics have been included in an open Python framework, named LEAF. The purpose of LEAF is to provide a reference for end users to evaluate explanations in a standardised and unbiased way, and to guide researchers towards developing improved explainable techniques. Comment: 16 pages, 8 figures
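
    One flavour of the evaluation problem raised above is explanation stability, sketched below with a LIME-like local surrogate (a ridge regression fitted on perturbed samples around one instance) that is recomputed several times; the overlap of the top-k features serves as a stability score. This is not the LEAF metric set nor the LIME implementation; the dataset, the surrogate, and the top-3 choice are assumptions for illustration.

```python
# A minimal sketch of the kind of stability check discussed above, not the
# LEAF implementation: a LIME-like local surrogate (ridge regression fitted
# on perturbed samples around one instance) is recomputed several times and
# the overlap of the top-3 features is used as a stability score.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
black_box = RandomForestClassifier(random_state=0).fit(X, y)

def local_linear_explanation(x0, n_samples=500, kernel_width=1.0):
    """Fit a weighted linear surrogate to the black box around x0 and
    return the indices of its top-3 features by absolute coefficient."""
    perturbed = x0 + rng.normal(scale=0.5, size=(n_samples, x0.size))
    target = black_box.predict_proba(perturbed)[:, 1]
    dist = np.linalg.norm(perturbed - x0, axis=1)
    weights = np.exp(-(dist ** 2) / kernel_width ** 2)   # proximity kernel
    surrogate = Ridge(alpha=1.0).fit(perturbed, target, sample_weight=weights)
    return set(np.argsort(np.abs(surrogate.coef_))[-3:])

def jaccard(a, b):
    return len(a & b) / len(a | b)

x0 = X[0]
tops = [local_linear_explanation(x0) for _ in range(10)]
pairs = [(i, j) for i in range(len(tops)) for j in range(i + 1, len(tops))]
stability = np.mean([jaccard(tops[i], tops[j]) for i, j in pairs])
print(f"mean pairwise top-3 feature overlap: {stability:.2f}")
```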

    Patterns of Routes of Administration and Drug Tampering for Nonmedical Opioid Consumption: Data Mining and Content Analysis of Reddit Discussions

    The complex unfolding of the US opioid epidemic in the last 20 years has been the subject of a large body of medical and pharmacological research, and it has sparked a multidisciplinary discussion on how to implement interventions and policies to effectively control its impact on public health. This study leverages Reddit as the primary data source to investigate the opioid crisis. We aimed to find a large cohort of Reddit users interested in discussing the use of opioids, trace the temporal evolution of their interest, and extensively characterize patterns of the nonmedical consumption of opioids, with a focus on routes of administration and drug tampering. We used a semiautomatic information retrieval algorithm to identify subreddits discussing nonmedical opioid consumption, finding over 86,000 Reddit users potentially involved in firsthand opioid usage. We developed a methodology based on word embedding to select alternative colloquial and nonmedical terms referring to opioid substances, routes of administration, and drug-tampering methods. We modeled the preferences of adoption of substances and routes of administration, estimating their prevalence and temporal unfolding, and observing relevant trends such as the surge in synthetic opioids like fentanyl and an increasing interest in rectal administration. Ultimately, through the evaluation of odds ratios based on co-mentions, we measured the strength of association between opioid substances, routes of administration, and drug tampering, finding evidence of understudied abusive behaviors like chewing fentanyl patches and dissolving buprenorphine sublingually. We believe that our approach may provide a novel perspective for a more comprehensive understanding of the nonmedical abuse of opioid substances and inform the prevention, treatment, and control of its public health effects
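
    The co-mention odds ratio used above can be illustrated on toy data as in the sketch below, where the association between substance terms and route-of-administration terms is read off a 2x2 contingency table of their co-occurrence across posts. The posts and term lists are invented placeholders, not the study's lexicon.

```python
# A minimal sketch of a co-mention odds-ratio computation on made-up posts:
# the association between a substance term and a route of administration is
# measured from the 2x2 contingency table of their co-occurrence across posts.
posts = [
    "chewed a patch last night, bad idea",
    "snorted a blue, felt nothing",
    "plugging works faster than oral for me",
    "dissolved it under the tongue as prescribed",
    "patch plus chewing is how people overdose",
]

substance_terms = {"patch", "blue"}            # informal substance lexicon (toy)
route_terms = {"chew", "snort", "plug", "oral", "tongue"}

def mentions(post, terms):
    return any(t in post for t in terms)

a = b = c = d = 0
for post in posts:
    s = mentions(post, substance_terms)
    r = mentions(post, route_terms)
    if s and r:
        a += 1          # substance and route co-mentioned
    elif s:
        b += 1          # substance only
    elif r:
        c += 1          # route only
    else:
        d += 1          # neither

# Haldane-Anscombe correction (+0.5) avoids division by zero in small tables.
odds_ratio = ((a + 0.5) * (d + 0.5)) / ((b + 0.5) * (c + 0.5))
print(f"co-mention odds ratio: {odds_ratio:.2f}")
```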

    Streamlining models with explanations in the learning loop

    Several explainable AI methods allow a Machine Learning user to gain insights into the classification process of a black-box model in the form of local linear explanations. With such information, the user can judge which features are locally relevant for the classification outcome and get an understanding of how the model reasons. Standard supervised learning processes are purely driven by the original features and target labels, without any feedback loop informed by the local relevance of the features identified by post-hoc explanations. In this paper, we exploit this newly obtained information to design a feature engineering phase in which we combine explanations with feature values. To do so, we develop two different strategies, named Iterative Dataset Weighting and Targeted Replacement Values, which generate streamlined models that better mimic the explanation process presented to the user. We show how these streamlined models compare to the original black-box classifiers in terms of accuracy and compactness of the newly produced explanations. Comment: 16 pages, 10 figures, available repository
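
    The sketch below gives a generic flavour of feeding explanation-derived relevance back into training. It is not the Iterative Dataset Weighting or Targeted Replacement Values strategy from the paper: the relevance proxy (a crude perturbation sensitivity) and the rescaling step are assumptions made only to illustrate the feedback-loop idea.

```python
# A minimal, generic sketch of an explanation-informed feature-engineering
# loop. This is NOT the Iterative Dataset Weighting or Targeted Replacement
# Values strategy from the paper; it only illustrates the general idea of
# feeding local feature relevance back into training. All choices below
# (relevance proxy, rescaling) are assumptions made for the example.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=12, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

black_box = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Proxy for aggregated local relevance: mean absolute change in the predicted
# probability when each feature is perturbed (a crude sensitivity measure).
relevance = np.zeros(X_tr.shape[1])
base = black_box.predict_proba(X_tr)[:, 1]
for j in range(X_tr.shape[1]):
    X_pert = X_tr.copy()
    X_pert[:, j] += X_tr[:, j].std()
    relevance[j] = np.abs(black_box.predict_proba(X_pert)[:, 1] - base).mean()

# "Streamlined" model: a simple classifier trained on features rescaled by
# their relevance, so that locally irrelevant features are damped.
scaled_tr = X_tr * relevance
scaled_te = X_te * relevance
streamlined = LogisticRegression(max_iter=1000).fit(scaled_tr, y_tr)

print("black box accuracy:  ", black_box.score(X_te, y_te))
print("streamlined accuracy:", streamlined.score(scaled_te, y_te))
```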